Estimating the Posterior Probabilities Using the K-Nearest Neighbor Rule
Author
Abstract
In many pattern classification problems, an estimate of the posterior probabilities (rather than only a classification) is required. This is usually the case when some confidence measure in the classification is needed. In this article, we propose a new posterior probability estimator. The proposed estimator considers the K-nearest neighbors. It attaches a weight to each neighbor that contributes in an additive fashion to the posterior probability estimate. The weights corresponding to the K-nearest-neighbors (which add to 1) are estimated from the data using a maximum likelihood approach. Simulation studies confirm the effectiveness of the proposed estimator.
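To make the idea concrete, the following is a minimal Python sketch of the estimator described above: each of the K nearest neighbors adds its weight to the posterior estimate of its own class, and the K weights (non-negative, summing to 1) are fit by maximizing a leave-one-out log-likelihood on the training data. The function names, the Euclidean distance, the softmax parameterization of the weights, and the plain gradient ascent are illustrative assumptions, not the paper's exact procedure; labels are assumed to be integers 0..n_classes-1.

```python
import numpy as np

def knn_posterior(X_train, y_train, x, weights, n_classes):
    """Weighted K-NN posterior estimate: each of the K nearest neighbors
    contributes its weight, additively, to the class it belongs to."""
    K = len(weights)
    dists = np.linalg.norm(X_train - x, axis=1)
    nn = np.argsort(dists)[:K]                    # indices of the K nearest neighbors
    post = np.zeros(n_classes)
    for rank, idx in enumerate(nn):
        post[y_train[idx]] += weights[rank]       # additive contribution of neighbor #rank
    return post

def fit_weights(X, y, K, n_iter=500, lr=0.2):
    """Maximum-likelihood fit of the K weights via a leave-one-out log-likelihood.
    The weights are softmax-parameterized so they stay non-negative and sum to 1;
    plain gradient ascent is used purely for illustration."""
    n = len(X)
    # For every training point, record which of its K nearest neighbors
    # (among the *other* training points) share its label.
    match = np.empty((n, K))
    for i in range(n):
        d = np.linalg.norm(X - X[i], axis=1)
        d[i] = np.inf                             # leave the point itself out
        nbrs = np.argsort(d)[:K]
        match[i] = (y[nbrs] == y[i]).astype(float)
    theta = np.zeros(K)
    for _ in range(n_iter):
        w = np.exp(theta); w /= w.sum()
        p = match @ w + 1e-12                     # estimated probability of the true label
        grad_w = (match / p[:, None]).mean(axis=0)
        theta += lr * w * (grad_w - w @ grad_w)   # chain rule through the softmax
    w = np.exp(theta)
    return w / w.sum()
```

For example, with weights fitted on (X, y), knn_posterior(X, y, x_new, w, n_classes) returns a vector of estimated posterior probabilities for a query point; setting every weight to 1/K recovers the classical k-NN frequency estimate.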
Similar Resources
Software Cost Estimation by a New Hybrid Model of Particle Swarm Optimization and K-Nearest Neighbor Algorithms
A successful software project should be completed within a predetermined cost and time. Software is a product whose main cost lies in the expert workforce and professionals involved. The most important factor in software cost estimation (SCE) is therefore the trained workforce. The creative and abstract nature of software projects makes estimating their cost and time extremely difficult ...
Discriminant Adaptive Nearest Neighbor Classification
Nearest neighbor classification expects the class conditional probabilities to be locally constant, and suffers from bias in high dimensions. We propose a locally adaptive form of nearest neighbor classification to try to finesse this curse of dimensionality. We use a local linear discriminant analysis to estimate an effective metric for computing neighborhoods. We determine the local decision b...
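As a rough illustration of the local-metric idea in this related work, the sketch below estimates within-class (W) and between-class (B) scatter from a query point's neighborhood and builds a metric that stretches distances along locally discriminative directions, in the spirit of discriminant adaptive nearest neighbors. The function name, the regularization constant eps, and the exact normalizations are assumptions rather than the authors' formulation.

```python
import numpy as np

def local_lda_metric(X_nbhd, y_nbhd, eps=1.0):
    """Build a positive-definite matrix Sigma from a local neighborhood so that
    d(x, x0) = (x - x0)^T Sigma (x - x0) shrinks neighborhoods along directions
    where the local between-class scatter is large relative to the within-class scatter."""
    classes = np.unique(y_nbhd)
    d = X_nbhd.shape[1]
    mean_all = X_nbhd.mean(axis=0)
    W = np.zeros((d, d))                          # within-class scatter
    B = np.zeros((d, d))                          # between-class scatter
    for c in classes:
        Xc = X_nbhd[y_nbhd == c]
        mc = Xc.mean(axis=0)
        W += (Xc - mc).T @ (Xc - mc)
        B += len(Xc) * np.outer(mc - mean_all, mc - mean_all)
    W /= len(X_nbhd)
    B /= len(X_nbhd)
    # whiten with W, regularize the between-class scatter, and map back
    evals, evecs = np.linalg.eigh(W + 1e-8 * np.eye(d))
    W_inv_sqrt = evecs @ np.diag(evals ** -0.5) @ evecs.T
    B_star = W_inv_sqrt @ B @ W_inv_sqrt
    return W_inv_sqrt @ (B_star + eps * np.eye(d)) @ W_inv_sqrt
```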
Convergence Rate of the Fuzzy Generalized Nearest Neighbor Rule
The fuzzy k-nearest neighbor rule (k-NNR) has been applied in a variety of substantive areas. Yang and Chen [1] described a fuzzy generalized k-NN algorithm, which is a unified approach to a variety of fuzzy k-NNRs. They established the strong consistency of the posterior risk of the fuzzy generalized NNR. In this paper, we give their convergence rate. That is, the convergence rate of the posterior risk of the...
A Statistical Confidence-Based Adaptive Nearest Neighbor Algorithm for Pattern Classification
The k-nearest neighbor rule is one of the simplest and most attractive pattern classification algorithms. It can be interpreted as an empirical Bayes classifier based on the estimated a posteriori probabilities from the k nearest neighbors. The performance of the k-nearest neighbor rule relies on the locally constant a posteriori probability assumption. This assumption, however, becomes problem...
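The "empirical Bayes" reading mentioned in this excerpt boils down to estimating the posterior of class c as the fraction of the k nearest neighbors carrying label c. A minimal sketch (hypothetical function name, Euclidean distance and integer labels 0..n_classes-1 assumed):

```python
import numpy as np

def knn_posterior_unweighted(X_train, y_train, x, k, n_classes):
    """Classical k-NN a posteriori estimate: P(c | x) ~= k_c / k, where k_c is the
    number of the k nearest neighbors of x that carry label c."""
    nn = np.argsort(np.linalg.norm(X_train - x, axis=1))[:k]
    return np.bincount(y_train[nn], minlength=n_classes) / k
```

The weighted estimator from the main abstract reduces to this when all K weights are set to 1/K.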
Adaptive Kernel Metric Nearest Neighbor Classification
Nearest neighbor classification assumes locally constant class conditional probabilities. This assumption becomes invalid in high dimensions due to the curse of dimensionality. Severe bias can be introduced under these conditions when using the nearest neighbor rule. We propose an adaptive nearest neighbor classification method to try to minimize bias. We use quasiconformal transformed kernels t...
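To give a flavor of the kernel-metric idea in this excerpt, the snippet below computes the distance induced by a conformally transformed kernel K~(x, y) = c(x) c(y) K(x, y). The RBF base kernel, the gamma value, and the user-supplied positive scaling function c are illustrative assumptions, not the authors' specific construction.

```python
import numpy as np

def quasiconformal_kernel_distance(x, y, c, gamma=1.0):
    """Squared distance in the feature space of a conformally transformed RBF kernel:
    d^2(x, y) = K~(x, x) - 2 K~(x, y) + K~(y, y), with K~(a, b) = c(a) c(b) K(a, b)."""
    def K(a, b):
        return np.exp(-gamma * np.sum((a - b) ** 2))
    Kt = lambda a, b: c(a) * c(b) * K(a, b)
    return Kt(x, x) - 2.0 * Kt(x, y) + Kt(y, y)
```

Choosing c(.) to vary across the input space lets the induced metric expand or contract neighborhoods locally.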
Journal: Neural Computation
Volume: 17, Issue: 3
Pages: -
Publication year: 2005